Multiplicative Weights with Switching Experts
Author
Abstract
Due: Wednesday, April 20, 2016, 7 pm, Dropbox outside Stata G5.

Collaboration policy: collaboration is strongly encouraged. However, remember that:
1. You must write up your own solutions, independently.
2. You must record the name of every collaborator.
3. You must actually participate in solving all the problems. This is difficult in very large groups, so you should keep your collaboration groups limited to 3 or 4 people in a given week.
4. Write each problem on a separate sheet and write your name at the top of every sheet.
5. No bibles. This includes solutions posted to problems in previous years.
Similar resources
Performance of Multiplicative-weights-updates
Let us recall the mechanics of Multiplicative-Weights-Updates: at every time t, the learner maintains a weight vector w_t ≥ 0 over the experts. Given the weight vector, the probability distribution over the experts is computed as p_t = w_t / (w_t · 1). The weights are initialized at w_1 = (1/n) · 1. (Multiplicative-weights-update step.) Given the loss vector at time t, the weights are updated as follows: w...
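The mechanics above can be sketched in a few lines. This is a minimal illustration, not the paper's exact algorithm: the learning rate eta and the exponential update rule w_{t+1} = w_t · exp(−eta · loss_t) are common choices assumed here, since the excerpt cuts off before stating the update.

```python
import numpy as np

def mwu(losses, eta=0.1):
    """Run Multiplicative-Weights-Updates over T rounds.

    losses: (T, n) array; losses[t, i] is expert i's loss in [0, 1] at time t.
    eta: learning rate (an assumption; the excerpt does not specify it).
    Returns a (T, n) array whose row t is the distribution p_t.
    """
    T, n = losses.shape
    w = np.ones(n) / n                        # w_1 = (1/n) * 1
    dists = []
    for t in range(T):
        p = w / w.sum()                       # p_t = w_t / (w_t . 1)
        dists.append(p)
        w = w * np.exp(-eta * losses[t])      # assumed exponential update
    return np.array(dists)
```

With two experts where the first always suffers zero loss, the distribution concentrates on that expert over time, as the regret bounds for this family predict.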
Theory and applications of predictors that specialize
We study online learning algorithms that predict by combining the predictions of several subordinate prediction algorithms, sometimes called “experts.” These simple algorithms belong to the multiplicative weights family of algorithms. The performance of these algorithms degrades only logarithmically with the number of experts, making them particularly useful in applications where the number of ...
Beating the Multiplicative Weights Update Algorithm
Multiplicative weights update algorithms have been used extensively in designing iterative algorithms for many computational tasks. The core idea is to maintain a distribution over a set of experts and update this distribution in an online fashion based on the parameters of the underlying optimization problem. In this report, we study the behavior of a special MWU algorithm used for generating ...
Tight Lower Bounds for Multiplicative Weights Algorithmic Families
We study the fundamental problem of prediction with expert advice and develop regret lower bounds for a large family of algorithms for this problem. We develop simple adversarial primitives that lend themselves to various combinations, leading to sharp lower bounds for many algorithmic families. We use these primitives to show that the classic Multiplicative Weights Algorithm (MWA) has a regret...
COS 511: Theoretical Machine Learning
1. Review of the Bayes Algorithm. Last time we talked about the Bayes algorithm, in which we give priors π_i to each expert. The algorithm maintains weights w_{t,i} for each expert. The π_i values serve as the initial weights for the experts. Experts predict distributions p_{t,i} over the same set X, and the algorithm predicts the distribution q_t as a mixture of those distributions. To restate i...
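The mixture prediction described above can be sketched as follows. This is an illustrative sketch under one standard assumption not stated in the excerpt: after observing the outcome, each expert's weight is multiplied by the probability that expert assigned to it (the Bayesian posterior update).

```python
import numpy as np

def bayes_mixture(expert_dists, priors, outcomes):
    """Sketch of the Bayes algorithm over experts.

    expert_dists: (T, n, k) array; expert_dists[t, i] is p_{t,i}, expert i's
                  distribution over the k outcomes in X at time t.
    priors: length-n array of priors pi_i, used as the initial weights.
    outcomes: length-T sequence of observed outcome indices.
    Returns the list of mixture predictions q_t.
    """
    w = np.array(priors, dtype=float)         # w_{1,i} = pi_i
    qs = []
    for t, x in enumerate(outcomes):
        # q_t is the weight-mixture of the experts' distributions.
        q = w @ expert_dists[t] / w.sum()
        qs.append(q)
        # Assumed Bayesian update: reweight each expert by the probability
        # it assigned to the observed outcome x.
        w = w * expert_dists[t, :, x]
    return qs
```

With a perfect expert and a uniform-guessing expert under equal priors, the mixture q_t converges to the perfect expert's prediction, mirroring the logarithmic degradation in the number of experts mentioned above.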
Publication date: 2016